Day 6: Interaction Centered Not on Humans, but on Objects

Background

Artificial sensing is a term that genuinely shocked me when I first heard it, because it applies "sensing", a word I had reserved for humans and other living beings, to computers and other 'dead' artificial objects. Yet artificial sensing is actually very different from sensing as we experience it. Machines can see, but across a far wider span of the electromagnetic spectrum, which means they can perceive much more than we do. They can speak and hear, sense smell and temperature, move around, and so on. They even have ways of sensing that we entirely lack, such as wireless radio waves or direct cable links. Their ways of sensing are far richer than ours.


Idea!

I want to design two correlated objects that communicate through their own ways of sensing (talking, touching, seeing) that we humans can't understand. After all, it's their language: an objects' language.
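As a first thought experiment for the prototype, one channel two objects could share is sound just above the human hearing limit. The sketch below is purely hypothetical (the constants `BASE_HZ` and `STEP_HZ` and both function names are my own invention, not part of any design yet): object A encodes text as ultrasonic tone frequencies, and object B decodes them back, so the two "talk" in a band we can't hear.

```python
# Toy sketch: two objects exchanging a message in the ultrasonic band,
# inaudible to humans (above ~20 kHz). All names here are hypothetical.

BASE_HZ = 21_000   # start of the band, just above human hearing
STEP_HZ = 50       # frequency step per character code

def encode(message: str) -> list[int]:
    """Object A: map each character to a tone frequency in the ultrasonic band."""
    return [BASE_HZ + ord(ch) * STEP_HZ for ch in message]

def decode(tones: list[int]) -> str:
    """Object B: recover the characters from the received tone frequencies."""
    return "".join(chr((hz - BASE_HZ) // STEP_HZ) for hz in tones)

tones = encode("hi")
print(tones)          # [26200, 26250] -- every tone sits above 20 kHz
print(decode(tones))  # hi
```

A real prototype would of course need speakers, microphones, and error handling, but even this toy version captures the idea: a conversation happening right next to us that our senses simply miss.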


To-do List

A more concrete idea. A rough prototype may be needed.

Reflection

This speculative scenario is somewhat frightening: with such a vast array of sensing techniques, we can imagine future AI deceiving us in ways we may never detect. We are no match for them when it comes to sensing the world. Then again, thinking this way may simply be biased, because, as mentioned above, AI and many other artifacts are not humans, and we can't expect them to perceive or think the way we do.